Welcome, everybody. Today we are going to solve the last set of exercises; they are meant to refresh your knowledge. If you have any questions during the session, feel free to interrupt, or you can write them in the chat. As I mentioned in other sessions, the chat is more difficult for me to follow because I am focusing on the slides, but after each set of exercises I will check the chat and reply to the questions. We have eight exercises, and each of them is composed of at least two sub-exercises; after each of these subsets I will check the chat for your questions, but feel free to interrupt at any time.
Okay. Hello, yes, this session is going to be recorded: the slides will be uploaded to StudOn, and the recording of the session will be uploaded to FAU.tv. So, the first exercise is about the Bayesian classifier, and the first question is: what is the difference between discriminative and generative modeling?
In both cases we want to know the posteriors, but in generative modeling, to find the posteriors you first model and estimate the priors and the class-conditionals, whereas in discriminative modeling you estimate the posteriors directly. In more practical terms, we can say that in generative modeling what the classifier learns is the probability distribution of the data, and in discriminative modeling what you learn is the decision boundary. Examples of discriminative models are the SVM, the perceptron, and logistic regression.
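To make the contrast concrete, here is a short summary in formulas; the notation is mine and not taken from the slides:

```latex
% Generative: model the class-conditional and the prior,
% then obtain the posterior via Bayes' rule
p(y \mid x) \;\propto\; p(x \mid y)\, p(y)

% Discriminative: model the posterior (i.e. the decision
% boundary) directly, e.g. with parameters \theta
p(y \mid x; \theta)
```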
For generative modeling, an example is the GMM classifier, the one that we saw in the programming exercises and also in the last theoretical session about GMMs: there, we estimated a GMM for each class, and then we were able to classify the samples depending on which GMM modeled a given sample better. Yet another example of generative modeling would be the Bayesian classifier itself.
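As a refresher, here is a minimal sketch of how such a per-class GMM classifier could look in Python with scikit-learn; the function names and parameters are illustrative assumptions, not the code from the programming exercise:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_gmm_classifier(X, y, n_components=2):
    """Generative modeling: fit one GMM per class, i.e. estimate
    p(x | y) with a mixture and p(y) from class frequencies."""
    classes = np.unique(y)
    models, log_priors = {}, {}
    for c in classes:
        X_c = X[y == c]
        models[c] = GaussianMixture(n_components=n_components, random_state=0).fit(X_c)
        log_priors[c] = np.log(len(X_c) / len(X))  # estimated prior p(y = c)
    return classes, models, log_priors

def predict(X, classes, models, log_priors):
    """Bayesian decision rule: argmax_y [log p(x | y) + log p(y)]."""
    # score_samples gives the per-sample log-likelihood under each class GMM
    scores = np.stack(
        [models[c].score_samples(X) + log_priors[c] for c in classes], axis=1
    )
    return classes[np.argmax(scores, axis=1)]
```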
So, the next question is: what is the decision rule of the Bayesian classifier? First, let's quickly recall Bayes' rule. The joint probability can be written in two ways, as the posterior times the evidence or as the class-conditional times the prior; if we isolate the posterior, we obtain it as the class-conditional times the prior divided by the evidence. We then substitute this posterior into the decision rule, and we can drop the evidence term because it does not depend on y; it is constant for all possible values of y, so we can remove it. Finally, using the properties of the log function, we arrive at the final Bayesian decision rule.
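Since the transcript refers to expressions shown on the slides, here is a reconstruction of that derivation in standard notation (the exact symbols on the slides may differ):

```latex
\hat{y} \;=\; \arg\max_{y}\; p(y \mid x)
        \;=\; \arg\max_{y}\; \frac{p(x \mid y)\, p(y)}{p(x)}
        \;=\; \arg\max_{y}\; p(x \mid y)\, p(y)
        \;=\; \arg\max_{y}\; \big[\, \log p(x \mid y) + \log p(y) \,\big]
```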
The next question is: simplify the decision rule if there is no prior knowledge about the occurrence of the classes. If there is no prior knowledge, we treat the prior as uniform, so we remove the prior term from the rule and end up with only the likelihood term; there is not much science in this exercise.
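Again reconstructing the slide in standard notation, dropping the constant log-prior leaves the maximum-likelihood decision rule:

```latex
\hat{y} \;=\; \arg\max_{y}\; \log p(x \mid y)
```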
The last exercise about the Bayesian classifier is about showing the optimality of the Bayesian classifier for the zero-one loss. First, let's recall the definition of the zero-one loss: it is zero if the prediction and the target value are the same, and one otherwise.
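In formulas (notation assumed here, with the hat denoting the prediction and y the target):

```latex
L(\hat{y}, y) \;=\;
\begin{cases}
0 & \text{if } \hat{y} = y \\
1 & \text{otherwise}
\end{cases}
```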
We also know that the Bayesian decision rule minimizes the average loss. Knowing this, we can use the zero-one loss function in the decision rule: we substitute the definition of the average loss, where L is the zero-one loss. The resulting sum adds up the posteriors of all classes except the one corresponding to the chosen class, the one written without the prime symbol on the slide, so the average loss equals one minus the posterior of the chosen class. The one can then be removed from the equation because it is constant with respect to the decision, and minimizing the negative posterior is the same as maximizing the posterior, so we recover exactly the Bayesian decision rule; this shows its optimality for the zero-one loss.
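To make the step explicit, here is a reconstruction of the argument in standard notation (the slides may use primed symbols differently):

```latex
R(\hat{y} \mid x) \;=\; \sum_{y} L(\hat{y}, y)\, p(y \mid x)
                  \;=\; \sum_{y \neq \hat{y}} p(y \mid x)
                  \;=\; 1 - p(\hat{y} \mid x)

\arg\min_{\hat{y}} \big[\, 1 - p(\hat{y} \mid x) \,\big]
\;=\; \arg\max_{\hat{y}}\; p(\hat{y} \mid x)
```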